# 1.3B lightweight and efficient

**Llama 1B Dj Refine 150B** (Apache-2.0)

Based on the OpenLLaMA architecture, this large language model is pre-trained on the Data-Juicer-refined RedPajama and Pile datasets, and it outperforms other models at the same 1.3B parameter scale.

- Tags: Large Language Model, Transformers
- Author: datajuicer
- Downloads: 2,834
- Likes: 2
© 2025 AIbase